An Adaptive Accelerated Proximal Gradient Method and its Homotopy Continuation for Sparse Optimization
Authors
Abstract
We consider optimization problems with an objective function that is the sum of two convex terms: one is smooth and given by a black-box oracle, and the other is general but with a simple, known structure. We first present an accelerated proximal gradient (APG) method for problems where the smooth part of the objective function is also strongly convex. This method incorporates an efficient line-search procedure, and achieves the optimal iteration complexity for such composite optimization problems. In case the strong convexity parameter is unknown, we also develop an adaptive scheme that can automatically estimate it on the fly, at the cost of a slightly worse iteration complexity. Then we focus on the special case of solving the l1-regularized least-squares problem in the high-dimensional setting. In such a context, the smooth part of the objective (least-squares) is not strongly convex over the entire domain. Nevertheless, we can exploit its restricted strong convexity over sparse vectors using the adaptive APG method combined with a homotopy continuation scheme. We show that such a combination leads to a global geometric rate of convergence, and the overall iteration complexity has a weaker dependency on the restricted condition number than previous work.
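To make the adaptive scheme concrete, below is a minimal Python/NumPy sketch of an accelerated proximal gradient loop that runs with a guessed strong-convexity parameter mu and halves the guess, restarting from the latest iterate, when the composite gradient mapping fails to shrink at the rate the guess predicts. The restart test, the halving factor, and the names adaptive_apg and soft_threshold are illustrative assumptions, not the paper's exact certificate or algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def adaptive_apg(grad, prox, L, x0, mu0, tol=1e-8, max_iter=10000):
    """APG for a mu-strongly-convex composite objective with unknown mu.
    A guess for mu drives the momentum; when the composite gradient
    mapping visibly fails to decay at the guessed linear rate, the guess
    is halved and the method restarts (simplified stand-in for the
    paper's restart certificate)."""
    mu, x = min(mu0, L), x0.copy()
    while mu > 1e-12:
        q = mu / L
        beta = (1.0 - np.sqrt(q)) / (1.0 + np.sqrt(q))  # constant momentum
        x_prev, y = x.copy(), x.copy()
        g0, restarted = None, False
        for k in range(1, max_iter + 1):
            x = prox(y - grad(y) / L, 1.0 / L)
            g = L * np.linalg.norm(x - y)       # gradient-mapping norm
            if g <= tol:
                return x
            if g0 is None:
                g0 = g
            # Under the guessed mu, g should shrink roughly like
            # (1 - sqrt(q))^(k/2); if it clearly does not, mu is too big.
            if k >= 10 and g > g0 * (1.0 - np.sqrt(q)) ** (k / 2.0):
                mu *= 0.5                        # guess falsified: halve it
                restarted = True
                break
            y = x + beta * (x - x_prev)          # momentum extrapolation
            x_prev = x
        if not restarted:
            return x
    return x
```

For the l1-LS instance discussed in the abstract, one would plug in grad = lambda y: A.T @ (A @ y - b), L = np.linalg.norm(A, 2)**2, and prox = lambda v, t: soft_threshold(v, lam * t).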
Similar resources
An Adaptive Accelerated Proximal Gradient Method and its Homotopy Continuation for Sparse Optimization
We first propose an adaptive accelerated proximal gradient (APG) method for minimizing strongly convex composite functions with unknown convexity parameters. This method incorporates a restarting scheme to automatically estimate the strong convexity parameter and achieves a nearly optimal iteration complexity. Then we consider the l1-regularized least-squares (l1-LS) problem in the high-dimensio...
A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem
We consider solving the l1-regularized least-squares (l1-LS) problem in the context of sparse recovery for applications such as compressed sensing. The standard proximal gradient method, also known as iterative soft-thresholding when applied to this problem, has low computational cost per iteration but a rather slow convergence rate. Nevertheless, when the solution is sparse, it often exhibits f...
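For reference, the iterative soft-thresholding iteration mentioned in this snippet can be sketched in a few lines of Python/NumPy; the constant step size 1/L and the iteration budget are illustrative choices, and the O(1/k) rate is the generic sublinear guarantee.

```python
import numpy as np

def ista(A, b, lam, x0=None, max_iter=500):
    """Plain proximal gradient (iterative soft-thresholding) for
    0.5*||Ax - b||^2 + lam*||x||_1: cheap per iteration, but only a
    sublinear O(1/k) rate in general."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1]) if x0 is None else x0.copy()
    for _ in range(max_iter):
        g = A.T @ (A @ x - b)              # gradient of the smooth part
        v = x - g / L                      # forward (gradient) step
        x = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)  # prox step
    return x
```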
A Proximal-Gradient Homotopy Method for the L1-Regularized Least-Squares Problem
We consider the l1-regularized least-squares problem for sparse recovery and compressed sensing. Since the objective function is not strongly convex, standard proximal gradient methods only achieve sublinear convergence. We propose a homotopy continuation strategy, which employs a proximal gradient method to solve the problem with a sequence of decreasing regularization parameters. It is shown ...
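The continuation strategy described in this snippet can be written generically as below. The inner solver, the shrinking factor eta, and the stage-tolerance rule delta*lam are illustrative placeholders, not the paper's tuned choices; the only paper-independent fact used is that x = 0 is optimal whenever lam >= ||A^T b||_inf.

```python
import numpy as np

def homotopy(solve_stage, A, b, lam_target, eta=0.7, delta=0.2, tol=1e-8):
    """Homotopy continuation for l1-LS: start at lam_max (where x = 0
    is optimal) and solve a sequence of problems with geometrically
    decreasing regularization, warm-starting each stage at the previous
    solution. `solve_stage(lam, x0, tol)` is any inner solver, e.g.
    proximal gradient or APG."""
    lam = np.linalg.norm(A.T @ b, np.inf)   # above this level, 0 is optimal
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        lam = max(eta * lam, lam_target)    # shrink the regularization
        x = solve_stage(lam, x, delta * lam)  # solve loosely, warm-started
    return solve_stage(lam_target, x, tol)  # final stage to full accuracy
```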
Accelerated Block Coordinate Proximal Gradients with Applications in High Dimensional Statistics
Nonconvex optimization problems arise in many research fields and attract considerable attention in signal processing, statistics, and machine learning. In this work, we explore the accelerated proximal gradient method and some of its variants, which have recently been shown to converge in nonconvex settings. We show that a novel variant proposed here, which exploits adaptive momentum and block ...
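As a rough sketch of the block-coordinate idea behind this snippet (without the adaptive momentum that is its actual contribution), a randomized block proximal gradient step for l1-LS might look as follows; the block partition and iteration budget are assumptions, e.g. blocks = np.array_split(np.arange(A.shape[1]), n_blocks).

```python
import numpy as np

def block_prox_grad(A, b, lam, blocks, max_iter=2000, rng=None):
    """Randomized block-coordinate proximal gradient for
    0.5*||Ax - b||^2 + lam*||x||_1: each step updates one block of
    coordinates with a prox step, at a fraction of a full-gradient
    cost (no acceleration/momentum in this sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.zeros(A.shape[1])
    r = A @ x - b                                 # maintained residual Ax - b
    Ls = [np.linalg.norm(A[:, blk], 2) ** 2 for blk in blocks]  # blockwise L
    for _ in range(max_iter):
        i = rng.integers(len(blocks))
        blk, L = blocks[i], Ls[i]
        g = A[:, blk].T @ r                       # gradient w.r.t. this block
        v = x[blk] - g / L
        new = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)
        r += A[:, blk] @ (new - x[blk])           # incremental residual update
        x[blk] = new
    return x
```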
Accelerated Sparse Linear Regression via Random Projection
In this paper, we present an accelerated numerical method based on random projection for sparse linear regression. Previous studies have shown that under appropriate conditions, gradient-based methods enjoy a geometric convergence rate when applied to this problem. However, the time complexity of evaluating the gradient is as large as O(nd), where n is the number of data points and d is the dim...
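The following is a rough Python/NumPy illustration of how a random projection can cut the O(nd) per-iteration gradient cost: the Gram term in the gradient A^T(Ax - b) is approximated through an m x n Gaussian sketch computed once up front. This shows only the generic sketching idea under assumed parameter names (m, S), not the cited paper's actual construction or guarantees.

```python
import numpy as np

def sketched_prox_grad(A, b, lam, m, max_iter=500, rng=None):
    """Proximal gradient for l1-regularized least squares where the
    gradient A^T(Ax - b) is approximated via a random projection:
    grad(x) ~= (SA)^T (SA x) - A^T b with a Gaussian sketch S (m x n,
    m << n). After the one-off sketch, each iteration costs O(md)
    instead of O(nd)."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = A.shape
    S = rng.standard_normal((m, n)) / np.sqrt(m)  # JL-style sketch matrix
    As = S @ A                                    # m x d, computed once
    Atb = A.T @ b                                 # exact linear term, kept
    L = np.linalg.norm(As, 2) ** 2
    x = np.zeros(d)
    for _ in range(max_iter):
        g = As.T @ (As @ x) - Atb                 # approximate gradient
        v = x - g / L
        x = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)
    return x
```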
Journal: Comp. Opt. and Appl.
Volume: 60, Issue: -
Pages: -
Publication date: 2014